Global Convergence of Splitting Methods for Nonconvex Composite Optimization


Similar Articles

Global Convergence of Splitting Methods for Nonconvex Composite Optimization

We consider the problem of minimizing the sum of a smooth function h with a bounded Hessian, and a nonsmooth function. We assume that the latter function is a composition of a proper closed function P and a surjective linear map M, with the proximal mappings of τP, τ > 0, simple to compute. This problem is nonconvex in general and encompasses many important applications in engineering and mach...
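The abstract's standing assumption is that the proximal mapping of τP is simple to compute even though P may be nonconvex. A standard instance is P = ‖·‖₀, whose proximal mapping is hard thresholding; the sketch below is illustrative (the function name and setup are mine, not from the paper):

```python
import numpy as np

def prox_l0(v, tau):
    """Proximal mapping of tau * ||x||_0 (hard thresholding).

    Minimizes tau * ||x||_0 + 0.5 * ||x - v||^2 componentwise:
    an entry is zeroed exactly when dropping it is cheaper than
    keeping it, i.e. when v_i**2 <= 2 * tau.
    """
    out = v.copy()
    out[v ** 2 <= 2.0 * tau] = 0.0
    return out
```

For example, prox_l0(np.array([0.5, 2.0]), 1.0) zeroes the first entry (0.25 ≤ 2) and keeps the second unchanged, illustrating that this nonconvex prox has a closed form.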


An Alternating Proximal Splitting Method with Global Convergence for Nonconvex Structured Sparsity Optimization

In many learning tasks with structural properties, structured sparse modeling usually leads to better interpretability and higher generalization performance. While great efforts have focused on the convex regularization, recent studies show that nonconvex regularizers can outperform their convex counterparts in many situations. However, the resulting nonconvex optimization problems are still ch...


Global Convergence of Langevin Dynamics Based Algorithms for Nonconvex Optimization

We present a unified framework to analyze the global convergence of Langevin dynamics based algorithms for nonconvex finite-sum optimization with n component functions. At the core of our analysis is a direct analysis of the ergodicity of the numerical approximations to Langevin dynamics, which leads to faster convergence rates. Specifically, we show that gradient Langevin dynamics (GLD) and st...
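The gradient Langevin dynamics (GLD) iteration analyzed in this line of work is plain gradient descent plus appropriately scaled Gaussian noise. A minimal sketch (step size eta and inverse temperature beta are generic symbols, not necessarily the paper's notation):

```python
import numpy as np

def gld_step(x, grad_f, eta, beta, rng):
    # One GLD update: x - eta * grad f(x) + sqrt(2 * eta / beta) * N(0, I).
    # The injected noise lets the iterates escape poor local minima,
    # which is what enables global convergence guarantees.
    noise = rng.standard_normal(x.shape)
    return x - eta * grad_f(x) + np.sqrt(2.0 * eta / beta) * noise
```

At large beta (low temperature) the noise term vanishes and the update reduces to ordinary gradient descent.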


Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization

This paper considers a class of constrained stochastic composite optimization problems whose objective function is given by the summation of a differentiable (possibly nonconvex) component, together with a certain non-differentiable (but convex) component. In order to solve these problems, we propose a randomized stochastic projected gradient (RSPG) algorithm, in which proper mini-batch of samp...
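The core RSPG step averages a mini-batch of stochastic gradients, takes a gradient step, and maps the result back onto the feasible set. A hedged sketch (helper names are mine; the paper's actual update is a prox/projection tied to its randomized stepsize policy):

```python
import numpy as np

def rspg_step(x, stoch_grads, gamma, project):
    # Average the mini-batch of stochastic gradient samples,
    # take a gradient step of size gamma, then project back
    # onto the constraint set.
    g = np.mean(stoch_grads, axis=0)
    return project(x - gamma * g)
```

With project = lambda z: np.maximum(z, 0.0) (nonnegative orthant), a step from x = [1.0, 0.2] with mini-batch mean gradient [1, 1] and gamma = 0.5 lands at [0.5, 0.0]: the infeasible second coordinate is clipped by the projection.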


Global Convergence of ADMM in Nonconvex Nonsmooth Optimization

In this paper, we analyze the convergence of the alternating direction method of multipliers (ADMM) for minimizing a nonconvex and possibly nonsmooth objective function, φ(x_0, ..., x_p, y), subject to coupled linear equality constraints. Our ADMM updates each of the primal variables x_0, ..., x_p, y, followed by updating the dual variable. We separate the variable y from the x_i's as it has a spe...
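The update order described above (each primal block in turn, then the dual variable) can be shown on a tiny two-block problem. For a verifiable sketch I use a convex ℓ1 term, minimizing 0.5·||x − b||² + lam·||y||₁ subject to x = y; the paper's analysis covers nonconvex, possibly nonsmooth objectives, and this example only illustrates the update pattern:

```python
import numpy as np

def admm_l1(b, lam, rho=1.0, iters=100):
    # ADMM for: min 0.5*||x - b||^2 + lam*||y||_1  s.t.  x - y = 0,
    # with scaled dual variable u and penalty parameter rho.
    x = y = u = np.zeros_like(b)
    for _ in range(iters):
        # x-update: minimize 0.5*||x - b||^2 + (rho/2)*||x - y + u||^2.
        x = (b + rho * (y - u)) / (1.0 + rho)
        # y-update: prox of (lam/rho)*||.||_1, i.e. soft thresholding.
        v = x + u
        y = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # Dual update on the residual x - y.
        u = u + x - y
    return x, y
```

For b = [3.0, 0.5] and lam = 1.0 the iterates converge to the soft-thresholded point [2.0, 0.0], with the primal residual x − y shrinking to zero.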



Journal

Journal title: SIAM Journal on Optimization

Year: 2015

ISSN: 1052-6234, 1095-7189

DOI: 10.1137/140998135